On Adaptivity Of BlockShrink Wavelet Estimator Over Besov Spaces
Authors
Abstract
Cai (1996b) proposed a wavelet method, BlockShrink, for estimating regression functions of unknown smoothness from noisy data by thresholding empirical wavelet coefficients in groups rather than individually. BlockShrink utilizes information about neighboring wavelet coefficients and thus increases the accuracy with which the coefficients are estimated. In the present paper, we offer insights into the BlockShrink procedure and show that the minimax optimality of the BlockShrink estimators holds broadly over a wide range of Besov classes B^α_{p,q}(M). We prove that the BlockShrink estimators attain the exact optimal rate of convergence over a wide interval of Besov classes with p ≥ 2, and achieve the optimal convergence rate within a logarithmic factor over the Besov classes with p < 2. We also show that the BlockShrink estimators enjoy a smoothness property: if the underlying function is the zero function, then, with high probability, the BlockShrink estimate is also the zero function. Thus the BlockShrink procedure removes pure noise completely.
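The groupwise thresholding idea described above can be sketched in a few lines. This is a simplified James–Stein-style block shrinkage illustration, not Cai's exact BlockShrink rule; the block length `block_len` and the threshold constant `lam` are illustrative assumptions, and the noise level `sigma` is taken as known.

```python
import numpy as np

def block_shrink(coeffs, sigma, block_len=4, lam=4.505):
    """Shrink wavelet detail coefficients blockwise.

    Each block is kept or shrunk as a unit, so neighboring
    coefficients pool their evidence: a block whose total energy
    falls below the threshold is set entirely to zero.
    """
    out = np.asarray(coeffs, dtype=float).copy()
    n = len(out)
    for start in range(0, n, block_len):
        block = out[start:start + block_len]
        s2 = np.sum(block ** 2)                  # block energy
        thresh = lam * len(block) * sigma ** 2   # energy threshold
        # Shrinkage factor: 0 when the block looks like pure noise,
        # close to 1 when the block carries strong signal.
        factor = max(0.0, 1.0 - thresh / s2) if s2 > 0 else 0.0
        out[start:start + block_len] = factor * block
    return out
```

Because the decision is made per block rather than per coefficient, a block of uniformly small (noise-like) coefficients is zeroed out entirely, which mirrors the abstract's claim that pure noise is removed completely with high probability.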
Similar papers
Nonparametric Regression in Exponential Families
Most results in nonparametric regression theory are developed only for the case of additive noise. In such a setting many smoothing techniques including wavelet thresholding methods have been developed and shown to be highly adaptive. In this paper we consider nonparametric regression in exponential families which include, for example, Poisson regression, binomial regression, and gamma regressi...
Adaptive Wavelet Estimation: A Block Thresholding And Oracle Inequality Approach
We study wavelet function estimation via the approach of block thresholding and ideal adaptation with oracle. Oracle inequalities are derived and serve as guides for the selection of smoothing parameters. Based on an oracle inequality and motivated by the data compression and localization properties of wavelets, an adaptive wavelet estimator for nonparametric regression is proposed and the opti...
Nonparametric Regression in Natural Exponential Families
Abstract: Theory and methodology for nonparametric regression have been particularly well developed in the case of additive homoscedastic Gaussian noise. Inspired by asymptotic equivalence theory, there have been ongoing efforts in recent years to construct explicit procedures that turn other function estimation problems into a standard nonparametric regression with Gaussian noise. Then in prin...
Rates of Convergence and Adaptation over Besov Spaces under Pointwise Risk
Function estimation over the Besov spaces under pointwise r (1 ≤ r < ∞) risks is considered. Minimax rates of convergence are derived using a constrained risk inequality and wavelets. Adaptation under pointwise risks is also considered. Sharp lower bounds on the cost of adaptation are obtained and are shown to be attainable by a wavelet estimator. The results demonstrate important differences b...
On Block Thresholding in Wavelet Regression with Long Memory Correlated Noise
Johnstone and Silverman (1997) and Johnstone (1999) described a level-dependent thresholding method for extracting signals from both short- and long-range dependent noise in the wavelet domain. It is shown that their Stein unbiased risk estimators (SURE) attain the exact optimal convergence rates over a wide range of Besov balls in certain asymptotic models of standard sample-data models....